-
Understanding how individuals focus and perform visual searches during collaborative tasks can help improve user engagement, and eye tracking measures provide informative cues for such understanding. This article presents A-DisETrac, an advanced analytic dashboard for distributed eye tracking. It uses off-the-shelf eye trackers to monitor multiple users in parallel, computes both traditional and advanced gaze measures in real time, and displays them on an interactive dashboard. The system was evaluated in two pilot studies in terms of user experience and utility, and compared with existing work. It was also used to study how advanced gaze measures, such as the ambient-focal coefficient K and the real-time index of pupillary activity, relate to collaborative behavior. The time a group took to complete a puzzle was related to its quantified ambient visual scanning behavior: groups that spent more time exhibited more scanning behavior. User experience questionnaire results suggest that the dashboard provides a comparatively good user experience.
-
A large body of literature documents the sensitivity of pupil response to cognitive load (e.g., Krejtz et al., 2018) and emotional arousal (Bradley et al., 2008). Recent empirical evidence also shows that microsaccade characteristics and dynamics can be modulated by mental fatigue and cognitive load (e.g., Dalmaso et al., 2017), but very little is known about the sensitivity of microsaccadic characteristics to emotional arousal. The present paper demonstrates, in a controlled experiment, pupillary and microsaccadic responses to information processing during multi-attribute decision making under affective priming. Twenty-one psychology students were randomly assigned to one of three affective priming conditions (neutral, aversive, or erotic) and were tasked with making several discriminative decisions based on acquired cues. In line with expectations, results showed microsaccadic rate inhibition and pupillary dilation that depended on cognitive effort (the number of cues acquired) prior to a decision. These effects were moderated by affective priming: aversive priming strengthened the pupillary and microsaccadic response to information processing effort. In general, the results suggest that pupillary response is more biased by affective priming than microsaccadic rate. The results are discussed in light of the neuropsychological mechanisms of pupillary and microsaccadic behavior generation.
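The microsaccadic rate measure discussed above is typically obtained with a velocity-threshold detector. Purely as an illustration (the abstract does not describe the study's pipeline), the sketch below follows the spirit of Engbert and Kliegl's median-based velocity threshold; the sampling rate, threshold multiplier `lam`, and minimum event duration are assumed parameters, not the study's values.

```python
import numpy as np

def microsaccade_rate(gaze_x, gaze_y, fs, lam=6.0, min_samples=3):
    """Rough velocity-threshold microsaccade detector in the spirit of
    Engbert & Kliegl (2003). Returns detected events per second.
    Illustrative sketch only, not the study's actual pipeline."""
    vx = np.gradient(np.asarray(gaze_x, float)) * fs
    vy = np.gradient(np.asarray(gaze_y, float)) * fs
    # robust, median-based estimate of velocity spread per axis
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    # elliptic threshold at lam robust standard deviations
    above = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0
    # group consecutive supra-threshold samples into candidate events
    edges = np.diff(above.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if above[0]:
        onsets = np.r_[0, onsets]
    if above[-1]:
        offsets = np.r_[offsets, above.size]
    n_events = sum(1 for s, e in zip(onsets, offsets) if e - s >= min_samples)
    return n_events * fs / above.size
```

On synthetic gaze traces with a few injected high-velocity pulses, the function counts one event per pulse and ignores low-amplitude jitter.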
-
A novel eye-tracked measure of pupil diameter oscillation is derived as an indicator of cognitive load. The new metric, termed the Low/High Index of Pupillary Activity (LHIPA), is able to discriminate cognitive load (vis-à-vis task difficulty) in several experiments where the Index of Pupillary Activity (IPA) fails to do so. The rationale for the LHIPA is tied to the functioning of the human autonomic nervous system, yielding a hybrid measure based on the ratio of low to high frequencies of pupil oscillation. The paper's contribution is twofold. First, full documentation is provided for the calculation of the LHIPA; as with the IPA, researchers can apply this metric to their own experiments where a measure of cognitive load is of interest. Second, the robustness of the LHIPA is shown in analyses of three experiments: a restrictive fixed-gaze number counting task, a less restrictive fixed-gaze n-back task, and an applied eye-typing task.
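The published LHIPA is computed from a wavelet decomposition of the pupil signal; purely as a toy illustration of the underlying low/high frequency-ratio idea, the sketch below splits an FFT power spectrum at an assumed cutoff frequency. The cutoff and sampling rate here are illustrative assumptions, not the paper's values.

```python
import numpy as np

def low_high_power_ratio(pupil, fs, split_hz=0.5):
    """Toy spectral proxy for the low/high pupil-oscillation ratio
    idea behind the LHIPA. NOTE: the published LHIPA uses a wavelet
    decomposition; this FFT band-power split only sketches the idea."""
    x = np.asarray(pupil, float)
    x = x - x.mean()                       # drop the DC component
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    low = power[(freqs > 0) & (freqs <= split_hz)].sum()
    high = power[freqs > split_hz].sum()
    return low / high
```

A signal dominated by slow oscillation yields a large ratio; one dominated by fast oscillation yields a small ratio, mirroring the intuition that relatively more high-frequency pupil activity accompanies higher load.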
-
The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.
-
We develop an approach to using microsaccade dynamics for the measurement of task difficulty/cognitive load imposed by a visual search task of a layered surface. Previous studies provide converging evidence that task difficulty/cognitive load can influence microsaccade activity. We corroborate this notion. Specifically, we explore this relationship during visual search for features embedded in a terrain-like surface, with the eyes allowed to move freely during the task. We make two relevant contributions. First, we validate an approach to distinguishing between the ambient and focal phases of visual search. We show that this spectrum of visual behavior can be quantified by a single previously reported estimator, known as Krejtz's K coefficient. Second, we use ambient/focal segments based on K as a moderating factor for microsaccade analysis in response to task difficulty. We find that during the focal phase of visual search (a) microsaccade magnitude increases significantly, and (b) microsaccade rate decreases significantly, with increased task difficulty. We conclude that the combined use of K and microsaccade analysis may be helpful in building effective tools that provide an indication of the level of cognitive activity within a task while the task is being performed.
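Krejtz's K coefficient mentioned above contrasts z-scored fixation durations with the z-scored amplitudes of the saccades that follow them: positive values indicate focal viewing (long fixations, short saccades) and negative values indicate ambient scanning. A minimal sketch of this idea, assuming each amplitude in the input is the saccade following the corresponding fixation:

```python
import numpy as np

def coefficient_k_series(fix_durations_ms, sacc_amplitudes_deg):
    """Per-fixation K_i = z(duration_i) - z(following saccade amplitude),
    with z-scores computed over the whole recording. K_i > 0 suggests
    focal viewing; K_i < 0 suggests ambient scanning. Assumes
    sacc_amplitudes_deg[i] is the saccade following fixation i."""
    d = np.asarray(fix_durations_ms, float)
    a = np.asarray(sacc_amplitudes_deg, float)
    zd = (d - d.mean()) / d.std()
    za = (a - a.mean()) / a.std()
    return zd - za

def coefficient_k(fix_durations_ms, sacc_amplitudes_deg, start, end):
    """Mean K_i over a window [start, end) of fixations. Averaging over
    the entire recording is zero by construction (both z-series have
    zero mean), so K is read over windows or experiment segments."""
    series = coefficient_k_series(fix_durations_ms, sacc_amplitudes_deg)
    return float(series[start:end].mean())
```

On a recording whose first half has long fixations with small saccades and whose second half has the opposite pattern, the windowed K is positive (focal) then negative (ambient).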
-
In this paper I review gaze-based interaction, distinguishing eye movement analysis from synthesis in virtual reality and games for serious applications. My focus is on four forms of gaze-based interaction: diagnostic, active, passive, and expressive. In discussing each, I briefly review seminal results and recent advancements, highlighting outstanding research problems.
